# Japanese Pretraining
## RoBERTa Large Japanese
A large Japanese RoBERTa model pretrained on Japanese Wikipedia and the Japanese portion of CC-100, suitable for Japanese natural language processing tasks.
Tags: Large Language Model, Transformers, Japanese
Maintainer: nlp-waseda
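
Below is a minimal sketch of masked-token prediction with this checkpoint. The hub ID `nlp-waseda/roberta-large-japanese` is inferred from the maintainer name, and the space-separated input stands in for the word segmentation (e.g. via Juman++) that these checkpoints reportedly expect; verify both against the actual model card.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Hub ID assumed from the maintainer name; confirm on the model card.
MODEL_ID = "nlp-waseda/roberta-large-japanese"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)
model.eval()

# Words are space-separated here as a stand-in for Juman++ segmentation.
text = f"早稲田 大学 で 自然 言語 処理 を {tokenizer.mask_token} する 。"

inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and show the five most likely fillers.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_pos].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))
```
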
## RoBERTa Base Japanese
A base-sized Japanese RoBERTa model pretrained on Japanese Wikipedia and the Japanese portion of CC-100.
Tags: Large Language Model, Transformers, Japanese
Maintainer: nlp-waseda
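
For downstream tasks that need sentence representations rather than masked-token predictions, here is a hedged sketch of mean-pooled embeddings from the base checkpoint. The hub ID `nlp-waseda/roberta-base-japanese` is again an assumption inferred from the maintainer name, and the inputs follow the same pre-segmented convention as above.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Hub ID assumed from the maintainer name; confirm on the model card.
MODEL_ID = "nlp-waseda/roberta-base-japanese"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

# Space-separated words stand in for Juman++ segmentation.
sentences = [
    "日本 語 の 事前 学習 モデル です 。",
    "文 の 埋め込み を 取り出す 。",
]

batch = tokenizer(sentences, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**batch).last_hidden_state  # (batch, seq_len, hidden)

# Mean-pool over real (non-padding) tokens to get one vector per sentence.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)  # e.g. torch.Size([2, 768]) for a base-sized model
```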